AE: Contextual AR Experience
Description
Create an interactive mobile AR experience focused on spatial awareness, tracking, interaction, and environmental integration. The experience should be context-aware, augmenting a real or simulated physical object with an emphasis on reliable registration and meaningful user interaction on handheld devices.
Goals
- Configure and develop an AR Foundation mobile project (iOS or Android).
- Implement spatial awareness (planes, occlusion) and persistent augmentation through image tracking.
- Design and enable user interaction with augmented content (touch-based).
- Ensure visual consistency of virtual content with real-world lighting cues.
- Communicate design intent and robustness through documentation.
Requirements
Your AR application must demonstrate all three of the following core capabilities (illustrative AR Foundation sketches for each appear after this list):
1. Spatial Awareness & Occlusion:
- Detect planes (or simulate them in XR Simulation).
- Implement occlusion so virtual objects respect real geometry when possible.
2. Image Tracking:
- Use image tracking to spawn objects.
- Augmentation should remain stable as the user moves.
3. Augmented Content Interaction:
- Include at least one meaningful touch-based interaction with your virtual content (e.g., toggling a component, revealing info, repositioning, or triggering a change).
A full mobile deployment is not required if XR Simulation is used.
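For capability 1, plane detection and occlusion are mostly configured through the `ARPlaneManager` and `AROcclusionManager` components on the XR Origin and AR camera. The sketch below is a minimal example, assuming AR Foundation 5.x; the `occlusionManager` field and log message are illustrative, not a required implementation. It requests environment-depth occlusion and logs whatever mode the provider actually grants, which is useful material for your reflection document.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: request environment-depth occlusion and log the mode the provider
// (device or XR Simulation) actually grants.
public class OcclusionSetup : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;  // lives on the AR camera

    EnvironmentDepthMode? lastMode;

    void Start()
    {
        // Ask for the best available depth; the provider may downgrade or ignore this request.
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
    }

    void Update()
    {
        // currentEnvironmentDepthMode reports what was actually granted; log changes only.
        var mode = occlusionManager.currentEnvironmentDepthMode;
        if (mode != lastMode)
        {
            Debug.Log($"Environment depth mode: {mode}");
            lastMode = mode;
        }
    }
}
```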
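For capability 2, a common pattern is to parent spawned content to the `ARTrackedImage` so it follows every pose update, and to hide it while tracking is limited. This is a hedged sketch assuming AR Foundation 5.x; `contentPrefab` is a placeholder name for your own asset.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: spawn content on each detected reference image and hide it while the
// image is not actively tracked, so augmentations do not drift when tracking degrades.
[RequireComponent(typeof(ARTrackedImageManager))]
public class TrackedImageSpawner : MonoBehaviour
{
    [SerializeField] GameObject contentPrefab;   // virtual content to attach to the image

    ARTrackedImageManager imageManager;

    void Awake()     => imageManager = GetComponent<ARTrackedImageManager>();
    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        // Parent new content to the tracked image so it follows every pose update.
        foreach (var image in args.added)
            Instantiate(contentPrefab, image.transform);

        // Show content only while tracking is solid; hide it during Limited/None states.
        foreach (var image in args.updated)
            foreach (Transform child in image.transform)
                child.gameObject.SetActive(image.trackingState == TrackingState.Tracking);
    }
}
```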
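For capability 3, a simple tap interaction can raycast from the AR camera and toggle part of the augmentation as feedback. The sketch below uses the legacy Input Manager and assumes your content has a `Collider` and a child object (here called `infoPanel`) to reveal; the names and the exact interaction are placeholders for your own design.

```csharp
using UnityEngine;

// Sketch: tap the augmented object to reveal or hide an attached info panel.
// Requires a Collider on this object; uses the legacy Input Manager.
public class TapToToggle : MonoBehaviour
{
    [SerializeField] Camera arCamera;       // the AR camera under the XR Origin
    [SerializeField] GameObject infoPanel;  // content revealed/hidden as feedback

    void Update()
    {
        // Only react to the first frame of a new touch.
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        // Raycast from the touch point; toggle the panel if this object was hit.
        var ray = arCamera.ScreenPointToRay(Input.GetTouch(0).position);
        if (Physics.Raycast(ray, out var hit) && hit.transform == transform)
            infoPanel.SetActive(!infoPanel.activeSelf);
    }
}
```

Note that touch input is only available on device; if you test in XR Simulation, you may also want to handle mouse clicks (for example with `Input.GetMouseButtonDown(0)`) or switch to the new Input System if your project uses it.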
Testing With XR Simulation
If you are unable to deploy your app to a mobile device, you may use Unity’s XR Simulation to test your AR experience. A full tutorial is provided at the end of this assignment.
Your final demonstration video may be:
- A screen recording of XR Simulation in the Unity Editor, or
- A screen recording of your mobile build running on device.
Graduate Extension
Graduate students must complete an extension that connects recent AR research to their project. This consists of two parts (plus an optional evaluation):
- Short Literature Review:
- Select 3–4 peer-reviewed papers from reputable AR/XR venues (e.g., IEEE ISMAR, IEEE VR, ACM CHI, UIST, DIS).
- Recommended themes: tracking stability, anchoring drift, multimodal interaction, context-aware AR, industrial AR workflows, spatial consistency, or mobile AR robustness.
- For each paper (≤250 words), briefly summarize:
- The engineering or AR problem addressed
- The technique, approach, or insight the authors contribute
- How the findings can inform the design of a more stable or meaningful mobile AR experience
- Research-Informed Enhancement Proposal:
- Propose one enhancement to your AR experience inspired by the reviewed literature.
- The enhancement can be conceptual or partially implemented.
- Write a Research Memo (≤3 pages) that:
- Connects literature insights to your idea
- Describes the intended behavior, design, or integration
- Explains how it would improve robustness, stability, or user perception
- Notes expected limitations
The graduate extension adds up to 25 additional points to the assignment score.
Deliverable
1. A Video Demonstration:
- Either a screen recording of your app running in XR Simulation, or a recording of your mobile device running the application. This video must clearly show:
- Spatial awareness & occlusion
- Image tracking or anchoring
- Touch-based interaction
2. Performance/Design Reflection Document (≤ 2 pages):
- How spatial awareness & occlusion were implemented
- How tracking/anchoring works
- How your interactions were designed
- How you addressed or observed tracking degradation
- Any limitations or design compromises
- Graduate Extension:
- Literature review (3–4 papers)
- Research-informed enhancement memo
- Optional mini evaluation
Guidelines
- Submissions must be made individually.
- Submit the video recording, Reflection PDF, and (if applicable) Graduate Memo via Canvas.
- If using image tracking, include the reference image inside the PDF.
- XR Simulation is an acceptable testing method if you lack device access.
- Keep content focused and stable; avoid unnecessary complexity.
Grading Rubric
Undergraduate Core (100 points)
| Criterion | Description | Points |
|---|---|---|
| Spatial Awareness & Occlusion | Plane detection and occlusion implemented reliably. | 30 |
| Image Tracking | Stable augmentation and appropriate handling of movement. | 30 |
| Interaction & Feedback | Touch interaction works; provides responsive feedback. | 20 |
| Video Demonstration | Clearly shows required features, either via device or XR Simulation. | 10 |
| Performance/Design Reflection | Clear explanation of design choices, degradation handling, and limitations. | 10 |
Graduate Extension Add-On (25 points)
| Criterion | Description | Points |
|---|---|---|
| Literature Review Quality | Strong summaries connected to mobile AR engineering. | 8 |
| Enhancement Proposal | Insightful design grounded in literature. | 9 |
| Optional Evaluation Reflection | Analysis of robustness tests tied back to readings. | 8 |
XR Simulation Tutorial
XR Simulation lets you test AR scenes directly inside Unity without needing a physical device. Students can preview tracking, image detection, environment interactions, and more—right in the Editor.
- Enable XR Simulation:
  - Open `Edit > Project Settings`.
  - Select `XR Plug-in Management`.
  - Enable `XR Simulation` under `Providers`.
- Select an XR Simulation Environment:
  - Open `Window > XR > AR Foundation > XR Environment`.
  - Use the `Environment` dropdown to select an environment.
  - (Optional) Click Install sample environments for richer test spaces.
- Run Your AR Scene in Simulation:
  - Your AR scene must include `AR Session` and `XR Origin`.
  - Open the scene.
  - Click `Play` to run the simulation. (A small sanity-check script is sketched after this tutorial.)
- Navigation Controls:
  - Right Mouse Button + Move Mouse → Look around
  - W / A / S / D → Move
  - Q / E → Down / Up
  - Shift → Move faster
- Editing or Creating Simulation Environments (Optional):
  - Use the `Create/Edit Environment` dropdown to create, duplicate, or modify an environment.
  - To convert your own prefab:
    - Add a `Simulation Environment` component to the root object.
    - Refresh the list: `Assets > Refresh XR Environment List`.
- Simulated AR Elements (Optional):
  - Simulated Tracked Image – test image tracking
  - Simulated Bounding Box – test object detection
  - Simulated Environment Probe / Light – test lighting behavior
  - Simulated Anchor – test persistent anchors
  - X-Ray Region – view indoor scenes more easily
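If planes or images never appear in Play mode, a small logging script can confirm that the simulated session is actually tracking. The sketch below is a minimal example, assuming AR Foundation 5.x; attach it anywhere in the AR scene and assign the `ARPlaneManager` from your XR Origin. It prints session-state changes and plane updates to the Console.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: log AR session state changes and plane updates so you can confirm
// XR Simulation is tracking the environment before recording your demo.
public class SimulationSanityCheck : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;  // the ARPlaneManager on your XR Origin

    void OnEnable()
    {
        ARSession.stateChanged     += OnSessionStateChanged;
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        ARSession.stateChanged     -= OnSessionStateChanged;
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnSessionStateChanged(ARSessionStateChangedEventArgs args)
        => Debug.Log($"AR session state: {args.state}");

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
        => Debug.Log($"Planes added: {args.added.Count}, updated: {args.updated.Count}, removed: {args.removed.Count}");
}
```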
Useful Links:
- XR Simulation: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/simulation.html
- AR Foundation Manual: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/index.html
- Tracked Images: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/tracked-image-manager.html
- Unity Learn (AR Foundation): https://learn.unity.com/search?k=%5B%22q%3Aar+foundation%22%5D